Think Globally, Embed Locally - Locally Linear Meta-embedding of Words

Authors

  • Danushka Bollegala
  • Kohei Hayashi
  • Ken-ichi Kawarabayashi
Abstract

Distributed word embeddings have shown superior performances in numerous Natural Language Processing (NLP) tasks. However, their performances vary significantly across different tasks, implying that the word embeddings learnt by those methods capture complementary aspects of lexical semantics. Therefore, we believe that it is important to combine the existing word embeddings to produce more accurate and complete meta-embeddings of words. For this purpose, we propose an unsupervised locally linear meta-embedding learning method that takes pre-trained word embeddings as the input, and produces more accurate meta-embeddings. Unlike previously proposed meta-embedding learning methods that learn a global projection over all words in a vocabulary, our proposed method is sensitive to the differences in local neighbourhoods of the individual source word embeddings. Moreover, we show that vector concatenation, a previously proposed highly competitive baseline approach for integrating word embeddings, can be derived as a special case of the proposed method. Experimental results on semantic similarity, word analogy, relation classification, and short-text classification tasks show that our meta-embeddings significantly outperform prior methods in several benchmark datasets, establishing a new state of the art for meta-embeddings.
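The concatenation baseline mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy vectors and source dictionaries are invented, and in practice the sources would be pre-trained embeddings such as word2vec or GloVe.

```python
import numpy as np

# Hypothetical source embeddings for a tiny vocabulary; these stand in
# for pre-trained vectors from two different embedding methods.
source_a = {"cat": np.array([0.1, 0.9]), "dog": np.array([0.2, 0.8])}
source_b = {"cat": np.array([0.7, 0.1, 0.3]), "dog": np.array([0.6, 0.2, 0.4])}

def concat_meta_embedding(word, sources):
    """Concatenate l2-normalised source embeddings of a word.

    The abstract notes that this concatenation baseline can be derived
    as a special case of the proposed locally linear method.
    """
    parts = []
    for emb in sources:
        v = emb[word]
        parts.append(v / np.linalg.norm(v))  # normalise so sources contribute equally
    return np.concatenate(parts)

meta = concat_meta_embedding("cat", [source_a, source_b])
print(meta.shape)  # (5,) -- dimensionality is the sum of the source dimensionalities
```

Normalising each source vector before concatenation keeps sources with large norms from dominating the cosine similarities computed on the meta-embedding.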


Related papers

Image Contrast Gain Control by Linear Neighbourhood Embedding

In this paper, we present a method that adaptively computes a contrast gain control map for an image through a novel technique termed linear neighborhood embedding (LNE), which first computes a locally linear relation for each pixel and its neighbors and then embeds these relations globally in the gain map image. We borrow the "think globally, fit locally" concept and computational ...


Embeddings of Locally Connected Compacta

Let X be a k-dimensional compactum and f: X → M^n a map into a piecewise linear n-manifold, n > k + 3. The main result of this paper asserts that if X is locally (2k − n)-connected and f is (2k − n + 1)-connected, then f is homotopic to a CE equivalence. In particular, every k-dimensional, r-connected, locally r-connected compactum is CE equivalent to a compact subset of R^(2k − r) as long as r < k − 3....


Built-In Learner Participation Potential of Locally- and Globally-Designed ELT Materials

This study aims at empirically measuring a universal criterion for materials evaluation, i.e., learning opportunities, in locally- and globally-designed materials. Adopting the conceptual framework of sociocultural theory and its conceptualization of learning as participation (Donato, 2000), the researchers utilized the methodological power of conversation analysis to examine how opportunit...


Think Globally, Fit Locally: Unsupervised Learning of Nonlinear Manifolds

The problem of dimensionality reduction arises in many fields of information processing, including machine learning, data compression, scientific visualization, pattern recognition, and neural computation. Here we describe locally linear embedding (LLE), an unsupervised learning algorithm that computes low dimensional, neighborhood preserving embeddings of high dimensional data. The data, assum...
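The LLE algorithm this snippet describes can be sketched in a few lines of numpy. This is a minimal sketch under simplifying assumptions (brute-force neighbour search, a fixed regulariser); the toy data and parameter values are illustrative, not from the paper.

```python
import numpy as np

def lle(X, n_neighbors=4, n_components=2, reg=1e-3):
    """Minimal LLE: reconstruct each point from its neighbours, then find
    low-dimensional coordinates that preserve the reconstruction weights."""
    n = X.shape[0]
    # Step 1: k nearest neighbours by Euclidean distance (excluding self).
    d = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nbrs = np.argsort(d, axis=1)[:, :n_neighbors]
    # Step 2: per point, solve for reconstruction weights summing to one.
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[nbrs[i]] - X[i]                         # centre neighbours on the point
        G = Z @ Z.T                                   # local Gram matrix
        G += reg * np.trace(G) * np.eye(n_neighbors)  # regularise for stability
        w = np.linalg.solve(G, np.ones(n_neighbors))
        W[i, nbrs[i]] = w / w.sum()
    # Step 3: bottom eigenvectors of (I - W)^T (I - W), skipping the
    # constant eigenvector, give the embedding coordinates.
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    vals, vecs = np.linalg.eigh(M)                    # eigenvalues in ascending order
    return vecs[:, 1:n_components + 1]

# Toy data: noisy points along a curve in 3-D, embedded into 2-D.
rng = np.random.default_rng(0)
t = np.linspace(0, 3, 40)
X = np.c_[np.cos(t), np.sin(t), t] + 0.01 * rng.standard_normal((40, 3))
Y = lle(X, n_neighbors=6, n_components=2)
print(Y.shape)  # (40, 2)
```

The key property is that only the local reconstruction weights are carried from the high-dimensional space to the low-dimensional one, which is what lets a linear solver recover a nonlinear manifold.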


Locally Linear Embedded Eigenspace Analysis

The existing nonlinear local methods for dimensionality reduction yield impressive results in data embedding and manifold visualization. However, they also open up the problem of how to define a unified projection from new data to the embedded subspace constructed by the training samples. Thinking globally and fitting locally, we present a new linear embedding approach, called Locally Embedded ...



Journal:
  • CoRR

Volume: abs/1709.06671

Publication year: 2017